The robots.txt file is a plain text file placed in your website's root directory. It tells search engine crawlers which parts of your site they may and may not crawl, making it an essential tool for managing your site's SEO and privacy.
The file consists of simple User-agent and Disallow directives, and it must live at the root of your domain (e.g., https://yourdomain.com/robots.txt). Here's a simple example:
# Block all web crawlers from all content
User-agent: *
Disallow: /
# Allow all web crawlers access to everything
User-agent: *
Disallow:
# Block a specific folder
User-agent: *
Disallow: /private-directory/
# Block Googlebot from accessing /test/
User-agent: Googlebot
Disallow: /test/
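Rules only apply to crawlers whose User-agent line matches. To sanity-check this behavior before deploying, you can evaluate rules locally with Python's standard-library urllib.robotparser; a minimal sketch (the bot names and URLs are illustrative placeholders):

# Sketch: per-agent rules. Googlebot is blocked from /test/, other bots are not.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: Googlebot",
    "Disallow: /test/",
])
print(rp.can_fetch("Googlebot", "https://yourdomain.com/test/page.html"))  # False
print(rp.can_fetch("Bingbot", "https://yourdomain.com/test/page.html"))    # True (no rule matches)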
You can pair a broad Disallow with a narrower Allow to carve out exceptions. Here, everything under /private/ is blocked except one file:
User-agent: *
Disallow: /private/
Allow: /private/allowed-file.html
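Be aware that crawlers resolve Allow/Disallow conflicts differently: Google applies the most specific matching rule regardless of order, while Python's urllib.robotparser takes the first rule that matches in file order. A minimal sketch of checking the exception locally (the Allow line is listed first so the stdlib parser honors it; the agent name and URLs are placeholders):

# Sketch: verify that the Allow exception overrides the broader Disallow.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Allow: /private/allowed-file.html",
    "Disallow: /private/",
])
print(rp.can_fetch("MyCrawler", "https://yourdomain.com/private/allowed-file.html"))  # True
print(rp.can_fetch("MyCrawler", "https://yourdomain.com/private/secret.html"))        # False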
Include your sitemap in robots.txt to help crawlers find it easily:
Sitemap: https://yourdomain.com/sitemap.xml
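If you want to confirm the entry is picked up, urllib.robotparser (Python 3.8+) can also report the Sitemap URLs a robots.txt declares; a small sketch reusing the line above:

# Sketch: list the Sitemap URLs declared in a robots.txt (Python 3.8+).
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow:",
    "Sitemap: https://yourdomain.com/sitemap.xml",
])
print(rp.site_maps())  # ['https://yourdomain.com/sitemap.xml']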
To learn more, visit the official guide: Google's robots.txt documentation